# Wikipedia optimization
## RuM2M100-1.2B
**License:** MIT
A Russian spelling-correction model based on M2M100-1.2B, capable of fixing spelling and typing errors.
**Tags:** Machine Translation, Transformers, Other
**Author:** ai-forever · **Downloads:** 407 · **Likes:** 10
## GePpeTto
A GPT-2 model pre-trained for Italian, with 117M parameters, trained on Italian Wikipedia and the ItWaC corpus.
**Tags:** Large Language Model, Other
**Author:** LorenzoDeMattei · **Downloads:** 78.22k · **Likes:** 15
## distilbert-base-uk-cased
**License:** Apache-2.0
A Ukrainian-specific version of the distilbert-base-multilingual-cased model that retains the original model's representational capabilities and accuracy.
**Tags:** Large Language Model, Transformers, Other
**Author:** Geotrend · **Downloads:** 20 · **Likes:** 2